Experiments on Combining Classifiers
Author
Abstract
In this paper, experiments on various single classifiers and on combinations of those classifiers are conducted, reported, and analyzed. Combining classifiers means having the individual classifiers support each other in making a decision, instead of taking a single classifier's decision as the final one. The base experiment applies both the different single classifiers and their combination to the same dataset. The same experiment is then repeated after separately applying feature reduction, Bagging, and Boosting. The dataset used in our experiments has not appeared in any publication before. Classifier performance is compared across all experiments in terms of accuracy. In these experiments, classifier combinations perform slightly better than the single classifiers in overall accuracy. It is also observed that bad classifiers and bad features may still carry information that combining rules can exploit to improve accuracy.
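The combining idea described in the abstract, letting several classifiers vote on each sample instead of trusting a single one, can be sketched with a plain majority vote. This is a minimal illustration, not the paper's actual combining rule; the classifier outputs below are hypothetical, not taken from the paper's dataset.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier predictions by plurality vote.

    predictions: a list of label sequences, one per classifier,
    all aligned on the same test samples.
    """
    combined = []
    for sample_preds in zip(*predictions):
        # The most common label among the classifiers wins.
        label, _ = Counter(sample_preds).most_common(1)[0]
        combined.append(label)
    return combined

# Three hypothetical classifiers voting on four test samples.
clf_a = [0, 1, 1, 0]
clf_b = [0, 1, 0, 0]
clf_c = [1, 1, 1, 0]
print(majority_vote([clf_a, clf_b, clf_c]))  # → [0, 1, 1, 0]
```

Note how the second classifier's mistake on the third sample is outvoted by the other two, which is exactly the effect the paper measures when combinations beat single classifiers.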
Similar Papers
Using Negative Correlation Learning to Improve the Performance of Neural Network Ensembles
This paper investigates the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...
Experiments with Classifier Combining Rules
A large experiment on combining classifiers is reported and discussed. It includes both the combination of different classifiers on the same feature set and the combination of classifiers on different feature sets. Various fixed and trained combining rules are used. It is shown that there is no overall winning combining rule and that bad classifiers as well as bad feature sets may contain val...
Improving Combination Methods of Neural Classifiers Using NCL
In this paper, the effect of diversity caused by Negative Correlation Learning (NCL) in the combination of neural classifiers is investigated and an efficient way to improve combining performance is presented. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are selected as base ensemble learners, and NCL versions of them are...
Intrusion Detection based on Incremental Combining Classifiers
Intrusion detection (ID) is the task of analyzing the events occurring on a network system in order to detect abnormal activity. Interest in Intrusion Detection Systems has increased because they work more constructively than traditional security mechanisms. As network data is dynamic in nature, it leads to the problem of incremental learning from dynamic data. Now, combining classifiers is a new method for ...
Learn++.NC: Combining Ensemble of Classifiers With Dynamically Weighted Consult-and-Vote for Efficient Incremental Learning of New Classes
We have previously introduced an incremental learning algorithm, Learn++, which learns novel information from consecutive data sets by generating an ensemble of classifiers with each data set and combining them by weighted majority voting. However, Learn++ suffers from an inherent "outvoting" problem when asked to learn a new class ω_new introduced by a subsequent data set, as earlier ...
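The weighted majority voting mentioned in the Learn++ snippet above can be illustrated with a minimal sketch in which each classifier's vote counts with its own weight. The weighting scheme and the numbers below are hypothetical placeholders, not Learn++'s actual dynamically weighted consult-and-vote procedure.

```python
def weighted_majority_vote(predictions, weights):
    """Combine per-classifier predictions, weighting each
    classifier's vote (e.g. by its estimated reliability).

    predictions: list of label sequences, one per classifier.
    weights: one non-negative weight per classifier.
    """
    combined = []
    for sample_preds in zip(*predictions):
        scores = {}
        for label, w in zip(sample_preds, weights):
            scores[label] = scores.get(label, 0.0) + w
        # The label with the highest accumulated weight wins.
        combined.append(max(scores, key=scores.get))
    return combined

# Three hypothetical classifiers; the first is trusted most,
# so its vote can override the other two on the first sample.
preds = [[0, 1], [1, 1], [1, 0]]
print(weighted_majority_vote(preds, [0.6, 0.2, 0.2]))  # → [0, 1]
```

With equal weights this reduces to the plain majority vote; giving newer or more accurate ensemble members larger weights is one way such schemes counter the "outvoting" of classifiers trained on a newly introduced class.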